A quadratically convergent Newton method for vector optimization

Authors

  • L. M. Graña Drummond
  • F. M. P. Raupp
  • B. F. Svaiter
Abstract

We propose a Newton method for solving smooth unconstrained vector optimization problems under partial orders induced by general closed convex pointed cones. The method extends the one proposed by Fliege, Graña Drummond and Svaiter for multicriteria optimization, which in turn extends the classical Newton method for scalar optimization. The steplength is chosen by means of an Armijo-like rule, guaranteeing a decrease in the objective value at each iteration. Under standard assumptions, we establish superlinear convergence to an efficient point. Additionally, as in the scalar case, assuming Lipschitz continuity of the second derivative of the vector-valued objective function, we prove q-quadratic convergence.
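As a concrete illustration of the two ingredients named in the abstract, the sketch below implements the Newton direction and the Armijo-like backtracking rule for the special case of the Pareto cone R^m_+ (the multicriteria setting of Fliege, Graña Drummond and Svaiter). The function names, the use of SciPy's SLSQP subproblem solver, and all parameter values are our own illustrative choices, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

def newton_direction(grads, hessians):
    """Newton direction for multicriteria optimization: solve
        min_{s,t} t   s.t.   grads[i] @ s + 0.5 * s' hessians[i] s <= t,
    a convex program when every Hessian is positive definite.
    grads has shape (m, n), hessians has shape (m, n, n).
    Returns the direction s and the optimal value theta (<= 0)."""
    m, n = grads.shape
    cons = [{"type": "ineq",
             "fun": lambda z, i=i: z[n] - grads[i] @ z[:n]
                                   - 0.5 * z[:n] @ hessians[i] @ z[:n]}
            for i in range(m)]
    res = minimize(lambda z: z[n], np.zeros(n + 1),
                   constraints=cons, method="SLSQP")
    return res.x[:n], res.fun   # theta == 0 only at a critical point

def armijo_newton(f, grad, hess, x, sigma=0.1, beta=0.5, tol=1e-10,
                  max_iter=50):
    """Newton iteration with an Armijo-like backtracking rule that
    enforces a decrease in *every* objective at each step."""
    for _ in range(max_iter):
        s, theta = newton_direction(grad(x), hess(x))
        if theta > -tol:            # approximately Pareto-critical
            break
        t = 1.0
        while np.any(f(x + t * s) > f(x) + sigma * t * theta):
            t *= beta               # backtrack until the rule holds
        x = x + t * s
    return x
```

For instance, with two objectives f(x) = (||x - a||^2, ||x - b||^2) the iterates approach a Pareto-optimal point on the segment between a and b; the subproblem value theta serves as the stopping criterion, vanishing exactly at Pareto-critical points.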


Related articles

A Quadratically Convergent Interior-Point Algorithm for the P*(κ)-Matrix Horizontal Linear Complementarity Problem

In this paper, we present a new path-following interior-point algorithm for P*(κ)-horizontal linear complementarity problems (HLCPs). The algorithm uses only full-Newton steps, which has the advantage that no line searches are needed. Moreover, with the small-update strategy we obtain the currently best known iteration bound for the algorithm, which is as good as that of the linear analogue.
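As a rough illustration of the full-Newton-step idea itself (a fixed shrinkage of the centering parameter and no line search), here is a sketch for the simpler standard monotone LCP s = Mx + q, x, s >= 0, x's = 0, rather than the paper's P*(κ)-horizontal form; the function name and parameter choices are ours.

```python
import numpy as np

def full_newton_lcp(M, q, x, s, mu, tol=1e-8):
    """Full-Newton-step path following for the monotone LCP
        s = M x + q,  x, s >= 0,  x's = 0,
    assuming the start (x, s) is strictly feasible and well centered,
    i.e. x * s is close to mu * ones(n)."""
    n = len(q)
    theta = 1.0 / (2.0 * np.sqrt(n))       # small-update parameter
    while n * mu > tol:                    # duality measure x's ~ n*mu
        mu *= 1.0 - theta                  # shrink the central-path target
        # Newton system:  M dx - ds = 0,   s*dx + x*ds = mu*e - x*s
        A = np.block([[M, -np.eye(n)],
                      [np.diag(s), np.diag(x)]])
        d = np.linalg.solve(A, np.concatenate([np.zeros(n), mu - x * s]))
        x, s = x + d[:n], s + d[n:]        # full step: no line search
    return x, s
```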


A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix

The nearest correlation matrix problem is to find a correlation matrix which is closest to a given symmetric matrix in the Frobenius norm. The well-studied dual approach is to reformulate this problem as an unconstrained continuously differentiable convex optimization problem. Gradient methods and quasi-Newton methods such as BFGS have been used directly to obtain globally convergent methods. S...
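A minimal sketch of the dual approach described above, assuming the standard dual formulation theta(y) = 0.5*||(G + Diag y)_+||_F^2 - e'y, with the nearest correlation matrix recovered as (G + Diag y*)_+. Here the dual is minimized with SciPy's BFGS, as the abstract mentions, rather than with the paper's Newton method.

```python
import numpy as np
from scipy.optimize import minimize

def psd_part(A):
    """Projection onto the positive-semidefinite cone via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.maximum(w, 0.0)) @ V.T

def dual_objective(y, G):
    """Dual function theta(y) = 0.5*||(G + Diag y)_+||_F^2 - e'y and its
    gradient diag((G + Diag y)_+) - e."""
    X = psd_part(G + np.diag(y))
    return 0.5 * np.sum(X * X) - y.sum(), np.diag(X) - 1.0

def nearest_correlation(G):
    """Nearest correlation matrix to the symmetric matrix G."""
    n = G.shape[0]
    res = minimize(dual_objective, np.zeros(n), args=(G,),
                   jac=True, method="BFGS")
    return psd_part(G + np.diag(res.x))   # unit diagonal at the optimum
```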


On Newton's Method for the Fermat-Weber Location Problem

This paper considers the Fermat-Weber location problem. It is shown that, after a suitable initialization, the standard Newton method can be applied to the Fermat-Weber problem and is globally and locally quadratically convergent. A numerical comparison with the popular Weiszfeld algorithm shows that Newton’s method is significantly more efficient than the Weiszfeld scheme.
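For comparison, both iterations mentioned above are easy to sketch for f(x) = sum_i w_i * ||x - a_i||: the Weiszfeld update is a fixed-point reweighting, while the Newton step uses the exact gradient and Hessian, which are well defined as long as x avoids the anchor points a_i. The function names, and the omission of the paper's special initialization, are our simplifications.

```python
import numpy as np

def weiszfeld_step(x, pts, w):
    """One Weiszfeld update: weighted average of the anchor points
    pts[i] with weights w[i] / ||x - pts[i]||."""
    d = np.linalg.norm(pts - x, axis=1)
    coef = w / d
    return (coef[:, None] * pts).sum(axis=0) / coef.sum()

def newton_step(x, pts, w):
    """One Newton step on f(x) = sum_i w_i * ||x - pts[i]||,
    valid away from the anchor points."""
    r = pts - x
    d = np.linalg.norm(r, axis=1)
    g = -(w / d) @ r                       # gradient of f at x
    n = x.size
    H = sum(wi / di * (np.eye(n) - np.outer(ri, ri) / di**2)
            for wi, di, ri in zip(w, d, r))
    return x - np.linalg.solve(H, g)
```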


Armijo Newton method for convex best interpolation

More than a decade ago, Newton's method was proposed for constructing the convex best interpolant. Its local quadratic convergence has only been established recently, by recasting it as a generalized Newton method for semismooth equations. It still remains unclear why the Newton method coupled with line search strategies performs well globally in practice. Similar to the classical...


Neural Network Learning Through Optimally Conditioned Quadratically Convergent Methods Requiring No Line Search

Neural network learning algorithms based on conjugate gradient techniques and quasi-Newton techniques, such as the Broyden, DFP, BFGS, and SSVM algorithms, require exact or inexact line searches in order to satisfy their convergence criteria. Line searches are very costly and slow down the learning process. This paper presents new neural network learning algorithms based on Hoshino's weak line search...



Journal title:

Volume:   Issue:

Pages:  -

Publication date: 2011